
feat: add llama.cpp \ ollama as rerank provider #363

Open

NW15D wants to merge 1 commit into CortexReach:master from NW15D:feature/llamacpp-rerank-provider

Conversation

@NW15D NW15D commented Mar 26, 2026

  • Add 'llamacpp' to RerankProvider type and config
  • Implement buildRerankRequest for llama.cpp API (optional API key)
  • Implement parseRerankResponse for llama.cpp format
  • Support local reranking without cloud API
  • Add comprehensive test suite (5 tests)
  • Update documentation with llama.cpp examples

TBD add llamacpp to openclaw.plugin.json

Closes #340
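The two helpers named above can be sketched as follows. This is an illustrative reconstruction, not the PR's actual code: the function signatures and config field names are assumptions, while the request/response shapes follow the Jina-compatible `/v1/rerank` endpoint that llama-server exposes (`query` + `documents` in, `results` with `index` and `relevance_score` out). The API key is optional, matching the local-server use case in the description.

```typescript
// Hypothetical sketch of the PR's two helpers for the llama.cpp
// rerank provider. Shapes follow llama-server's /v1/rerank endpoint;
// signatures and names here are assumptions for illustration.

interface RerankRequest {
  url: string;
  headers: Record<string, string>;
  body: { model: string; query: string; documents: string[] };
}

// API key is optional: a local llama.cpp server typically needs none.
function buildRerankRequest(
  baseUrl: string,
  model: string,
  query: string,
  documents: string[],
  apiKey?: string,
): RerankRequest {
  const headers: Record<string, string> = {
    "Content-Type": "application/json",
  };
  if (apiKey) headers["Authorization"] = `Bearer ${apiKey}`;
  return {
    url: `${baseUrl.replace(/\/$/, "")}/v1/rerank`,
    headers,
    body: { model, query, documents },
  };
}

// llama.cpp responds with { results: [{ index, relevance_score }] };
// normalize to { index, score } sorted by descending relevance.
function parseRerankResponse(response: {
  results: { index: number; relevance_score: number }[];
}): { index: number; score: number }[] {
  return response.results
    .map((r) => ({ index: r.index, score: r.relevance_score }))
    .sort((a, b) => b.score - a.score);
}
```

With a local server at `http://localhost:8080`, `buildRerankRequest` would omit the `Authorization` header entirely unless a key is supplied, which is what "optional API key" in the bullet list refers to.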

@NW15D NW15D changed the title from "feat: add llama.cpp as rerank provider" to "feat: add llama.cpp \ ollama as rerank provider" on Mar 26, 2026
@rwmjhb (Collaborator) commented Mar 28, 2026

The direction of this change is fine, but I'd suggest completing the public configuration surface at the same time: src/retriever.ts and the README already support/use rerankProvider: "llamacpp", but the schema in openclaw.plugin.json and the PluginConfig in index.ts have not been updated with this value yet. Since this is meant to be exposed to users as an official configuration option, it would be best to finish it all in one go.
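Addressing this review means keeping three places in agreement: the `RerankProvider` type, the `PluginConfig` in index.ts, and the enum in the openclaw.plugin.json schema. A minimal sketch of the first two, under the assumption that `PluginConfig` carries the rerank fields directly (only `'llamacpp'` comes from this PR; the other enum member and field names are placeholders):

```typescript
// Hedged sketch: sync PluginConfig with the new provider value.
// 'llamacpp' is from this PR; 'cohere' stands in for whatever
// providers the plugin already supports.
type RerankProvider = "cohere" | "llamacpp";

// Assumed field names, for illustration only.
interface PluginConfig {
  rerankProvider?: RerankProvider;
  rerankBaseUrl?: string; // e.g. http://localhost:8080 for llama.cpp
  rerankApiKey?: string;  // optional for a local server
}

// The same list must appear in openclaw.plugin.json's schema, e.g.
// "rerankProvider": { "type": "string", "enum": ["cohere", "llamacpp"] }.
// Keeping one runtime source of truth avoids the drift the review flags.
const KNOWN_RERANK_PROVIDERS: readonly RerankProvider[] = [
  "cohere",
  "llamacpp",
];

function isKnownRerankProvider(value: string): value is RerankProvider {
  return (KNOWN_RERANK_PROVIDERS as readonly string[]).includes(value);
}
```

Deriving the JSON schema enum (or at least a validation check) from `KNOWN_RERANK_PROVIDERS` is one way to guarantee the type and the schema cannot fall out of sync again.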



Development

Successfully merging this pull request may close these issues.

Feature Request: Support Ollama as a Rerank provider
